Nonlinear factorization in sparsely encoded Hopfield-like neural networks

Authors

  • Anton M. Sirota
  • Alexander A. Frolov
  • Dusan Húsek
Abstract

The problem of binary factorization of complex patterns in a recurrent Hopfield-like neural network was studied both theoretically and by means of computer simulation. The number and sparseness of the factors mixed in the patterns crucially determine the ability of an autoassociator to perform factorization. Based on experimental data on memory and learning, one may suggest that there exists a neural system for intermediate storage of information, which performs binary factorization of the incoming polysensory information for its further effective storage in the form of elementary, associatively bound factors. We suppose that field CA3 of the hippocampus, which possesses all the properties of an autoassociative memory, performs this function. This functional idea could be fruitfully applied to various memory-related tasks (e.g. spatial navigation) and could lead to some critical experiments.
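The setup the abstract describes can be illustrated with a minimal sketch, assuming a particular simple model: sparse binary factors mixed by Boolean OR, a covariance-style Hebbian rule, and a winner-take-all recall step that preserves sparseness. All sizes, parameter values, and the update rule here are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

# Minimal sketch (assumed model, for illustration only): training
# patterns are OR-superpositions of sparse binary factors; the network
# stores them with a covariance-style Hebbian rule; recall keeps the
# k most excited neurons active to preserve sparseness.

rng = np.random.default_rng(0)
N = 200      # neurons
L = 8        # number of factors
p = 0.05     # sparseness: expected fraction of active neurons per factor

# Sparse binary factors, one per row.
factors = (rng.random((L, N)) < p).astype(int)

def mix(idx):
    # Boolean OR of the selected factors.
    return (factors[idx].sum(axis=0) > 0).astype(int)

# Each training pattern mixes 3 randomly chosen factors.
patterns = np.array([mix(rng.choice(L, size=3, replace=False))
                     for _ in range(100)])

# Hebbian connection matrix for sparse coding, zero self-connections.
W = (patterns - p).T @ (patterns - p)
np.fill_diagonal(W, 0)

def recall_step(x, k):
    # Winner-take-all thresholding: activate the k most excited neurons.
    h = W @ x
    out = np.zeros_like(x)
    out[np.argsort(h)[-k:]] = 1
    return out

# Iterate recall starting from one of the factors; if the factor is an
# attractor of the network, the state stays close to it.
x = factors[0].copy()
k = int(x.sum())
for _ in range(10):
    x = recall_step(x, k)

overlap = (x @ factors[0]) / factors[0].sum()  # 1.0 means perfect recall
```

Whether the individual factors, rather than the mixed patterns, become attractors depends on the number and sparseness of the factors, which is exactly the dependence the paper studies.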


Similar articles

Binary Factorization in Hopfield-like Neural Networks: Single-step Approximation and Computer Simulations

The unsupervised learning of feature extraction in high-dimensional patterns is a central problem for the neural network approach. Feature extraction is a procedure which maps original patterns into a feature (or factor) space of reduced dimension. In this paper we demonstrate that Hebbian learning in a Hopfield-like neural network is a natural procedure for unsupervised learning of feature extr...

Full text

New Computer Algorithm Enabling Simulation of Recall Process in Sparsely Encoded Hopfield-like Neural Network of Extremely Large Size

An algorithm is proposed that minimizes the computer memory required to simulate the neurodynamics of a sparsely encoded Hopfield-like autoassociative memory. It does not require keeping both the connection matrix and the set of stored prototypes in memory. The algorithm makes it possible to simulate the recall process in neural networks of extremely large size and thus to calculate their asymptotic informational ...
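The memory-saving idea this abstract alludes to — simulating recall without materializing the N×N connection matrix — can be sketched as follows. The covariance-style Hebbian rule and all names are assumptions for illustration; the point is only that the local field W·x can be computed from the stored prototypes alone, in O(M·N) rather than O(N²) memory.

```python
import numpy as np

# Sketch (assumed Hebbian variant): W = C^T C with zero diagonal, where
# C holds the centered prototypes. The local field h = W x can be
# computed from C alone, never forming the N x N connection matrix.

rng = np.random.default_rng(1)
N, M, p = 1000, 50, 0.02
prototypes = (rng.random((M, N)) < p).astype(float)
C = prototypes - p  # centered prototypes, shape (M, N)

def field_via_matrix(x):
    # Reference implementation: materialize the full connection matrix.
    W = C.T @ C
    np.fill_diagonal(W, 0)
    return W @ x

def field_via_prototypes(x):
    # O(M*N) memory: project the state onto each prototype, expand
    # back, then subtract the self-connection (diagonal) contribution.
    proj = C @ x                  # overlap of x with each prototype
    h = C.T @ proj                # equals (C^T C) x
    diag = (C ** 2).sum(axis=0)   # diagonal of C^T C
    return h - diag * x
```

The same idea extends to the full recall iteration: each update needs only the M prototype overlaps, so networks far larger than the square-matrix memory limit can be simulated.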

Full text

Noise Injection Into Inputs in Sparsely Connected Hopfield and Winner-Take-All Neural Networks (IEEE Transactions on Systems, Man, and Cybernetics, Part B)

In this paper, we show that noise injection into inputs in unsupervised learning neural networks does not improve their performance as it does in supervised learning neural networks. Specifically, we show that training noise degrades the classification ability of a sparsely connected version of the Hopfield neural network, whereas the performance of a sparsely connected winner-take-all neural n...

Full text


Transient Dynamics of Sparsely Connected Hopfield Neural Networks with Arbitrary Degree Distributions

Using a probabilistic approach, the transient dynamics of sparsely connected Hopfield neural networks is studied for arbitrary degree distributions. A recursive scheme is developed to determine the time evolution of the overlap parameters. As illustrative examples, explicit calculations of the dynamics are performed for networks with binomial, power-law, and uniform degree distributions. The results ar...

Full text



Publication date: 1999